
    Spatial modeling of extreme snow depth

    The spatial modeling of extreme snow is important for adequate risk management in Alpine and high-altitude countries. A natural approach to such modeling is through the theory of max-stable processes, an infinite-dimensional extension of multivariate extreme value theory. In this paper we describe the application of such processes to modeling the spatial dependence of extreme snow depth in Switzerland, based on data for the winters 1966--2008 at 101 stations. The models we propose rely on a climate transformation that allows us to account for the presence of climate regions and for directional effects resulting from synoptic weather patterns. Estimation is performed through pairwise likelihood inference, and the models are compared using penalized likelihood criteria. The max-stable models provide a much better fit to the joint behavior of the extremes than do independence or full-dependence models. Comment: Published at http://dx.doi.org/10.1214/11-AOAS464 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
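    To make the pairwise-likelihood idea concrete, here is a minimal Python sketch that fits the single dependence parameter of a symmetric logistic bivariate extreme-value model by maximizing a composite likelihood over all station pairs. This is a simplified stand-in, not the paper's method: the station data, the logistic model, and the common parameter alpha are illustrative assumptions, and the paper's max-stable models additionally involve a climate transformation and directional effects.

```python
import numpy as np
from itertools import combinations
from scipy.optimize import minimize_scalar

# Symmetric logistic bivariate EV model with unit Frechet margins:
#   G(z1, z2) = exp(-V(z1, z2)),  V = (z1**(-1/a) + z2**(-1/a))**a,  0 < a <= 1
# (a -> 1: independence, a -> 0: complete dependence).

def neg_log_pair_density(z1, z2, a):
    """-log of the bivariate density g = exp(-V) * (V1*V2 - V12)."""
    s = z1 ** (-1.0 / a) + z2 ** (-1.0 / a)
    V = s ** a
    V1 = -s ** (a - 1.0) * z1 ** (-1.0 / a - 1.0)   # dV/dz1 (negative)
    V2 = -s ** (a - 1.0) * z2 ** (-1.0 / a - 1.0)   # dV/dz2
    V12 = ((a - 1.0) / a) * s ** (a - 2.0) * (z1 * z2) ** (-1.0 / a - 1.0)
    return -(-V + np.log(V1 * V2 - V12))

def pairwise_nll(a, Z):
    """Composite (pairwise) negative log-likelihood over all station pairs.

    Z: (n_years, n_stations) annual maxima, already transformed to
    unit Frechet margins (e.g. via fitted GEV margins)."""
    nll = 0.0
    for i, j in combinations(range(Z.shape[1]), 2):
        nll += neg_log_pair_density(Z[:, i], Z[:, j], a).sum()
    return nll

# Toy data: independent unit Frechet maxima at 5 hypothetical stations,
# 43 winters as in 1966--2008.
rng = np.random.default_rng(0)
Z = 1.0 / -np.log(rng.uniform(size=(43, 5)))

# Lower bound kept away from 0 to avoid numerical overflow in z**(-1/a).
res = minimize_scalar(pairwise_nll, bounds=(0.05, 1.0), args=(Z,),
                      method="bounded")
print(f"estimated dependence parameter alpha = {res.x:.3f}")  # near 1 here
```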

    Long-term changes in annual maximum snow depth and snowfall in Switzerland based on extreme value statistics

    Mountain snow cover is an important source of water and essential for winter tourism in Alpine countries. However, large amounts of snow can lead to destructive avalanches, floods, traffic interruptions or even the collapse of buildings. We use annual maximum snow depth and snowfall data from 25 stations (between 200 and 2,500 m) collected during the last 80 winters (1930/31 to 2009/10) to highlight temporal trends in annual maximum snow depth and 3-day snowfall sum. The generalized extreme value (GEV) distribution with time as a covariate is used to assess such trends. In particular, it allows us to infer how return levels and return periods have changed over the last 80 years. All the stations, even the highest one, show a decrease in extreme snow depth, which is significant mainly at low altitudes (below 800 m). A negative trend is also observed for extreme snowfall at low and high altitudes, but the pattern at mid-altitudes (between 800 and 1,500 m) is less clear. The decreasing trend in extreme snow depth and snowfall at low altitudes seems to be caused mainly by a reduction in the magnitude of the extremes rather than in the scale (variability) of the extremes. This may result from the observed decrease in the snow/rain ratio due to increasing air temperatures. In contrast, the decreasing trend in extreme snow depth above 1,500 m is caused by a reduction in the scale (variability) of the extremes and not by a reduction in their magnitude. However, the decreasing trends are significant for only about half of the stations and can only be seen as an indication that climate change may already be impacting extreme snow depth and extreme snowfall.
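    The core ingredient of the method, a GEV whose location parameter depends linearly on time, fitted by maximum likelihood so that return levels become time-dependent, can be sketched in a few lines of Python. The data, the trend form, and all parameter values below are purely illustrative assumptions, not the station records analysed in the study.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

# GEV with a linear time trend in the location parameter:
#   Z_t ~ GEV(mu0 + mu1 * t, sigma, xi)
# The m-year return level then changes with time t:
#   z_m(t) = mu(t) + (sigma / xi) * ((-log(1 - 1/m))**(-xi) - 1)

def nll(params, z, t):
    mu0, mu1, log_sigma, xi = params
    mu = mu0 + mu1 * t
    sigma = np.exp(log_sigma)             # keeps the scale positive
    # scipy's genextreme uses the shape convention c = -xi
    return -genextreme.logpdf(z, c=-xi, loc=mu, scale=sigma).sum()

def return_level(m, mu, sigma, xi):
    y = -np.log(1.0 - 1.0 / m)
    return mu + sigma / xi * (y ** (-xi) - 1.0)

# Toy data standing in for 80 annual maxima (1930/31--2009/10),
# with a slowly decreasing location: purely illustrative numbers.
rng = np.random.default_rng(1)
t = np.arange(80.0)
z = genextreme.rvs(c=-0.1, loc=150 - 0.4 * t, scale=30, size=80,
                   random_state=rng)

res = minimize(nll, x0=[150.0, 0.0, np.log(30.0), 0.1], args=(z, t),
               method="Nelder-Mead")
mu0, mu1, log_sigma, xi = res.x
for year in (0, 79):
    rl = return_level(50, mu0 + mu1 * year, np.exp(log_sigma), xi)
    print(f"50-year return level at t={year}: {rl:.0f}")
```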

    Hardness of chemically densified Yellow birch in relation to wood density, polymer content and polymer properties

    The density of wood can be increased by filling its porous structure with polymers. Such densification processes aim to increase the hardness of wood and are particularly interesting for flooring applications. This study evaluates the efficiency of different polymers for chemical densification on the basis of the polymer properties. Yellow birch (Betula alleghaniensis Britt.) was chemically densified with seven monomer mixtures through acrylate monomer impregnation and electron-beam in-situ polymerization. Chemical retention and polymer content of the densified woods were recorded. The hardness of treated and untreated Yellow birch was measured and compared to that of Jatoba (Hymenaea courbaril L.). All densified woods showed hardness higher than or comparable to Jatoba. The hardness of densified wood was analyzed in relation to the initial density of the wood and the polymer content of the material using multivariable linear mixed models. The efficiency of each polymer for chemical densification was evaluated through the effect of polymer content on hardness, using interaction coefficients. Polymer films corresponding to the impregnating monomer mixtures were prepared by low-energy electron beam and characterized by their glass transition temperature, microhardness, indentation modulus and crosslinking density. The polymers showed statistically significantly different efficiencies and fell into two main groups. Overall, polymer efficiency increased with increasing glass transition temperature of the polyacrylates.
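    A sketch of the kind of mixed model described is given below, assuming hypothetical column names (hardness, density, content, polymer, board): the polymer-by-content interaction plays the role of the per-polymer efficiency, with a random intercept per board. The study's actual model specification may differ; the data here are synthetic so the script runs end to end.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset: one row per hardness measurement, with the
# initial wood density, the polymer content, the monomer mixture
# (7 levels) and the board the sample was cut from.
rng = np.random.default_rng(2)
n = 280
df = pd.DataFrame({
    "polymer": rng.choice([f"P{i}" for i in range(1, 8)], size=n),
    "density": rng.normal(600, 40, size=n),    # kg/m^3
    "content": rng.uniform(20, 60, size=n),    # polymer content, %
    "board":   rng.integers(0, 40, size=n).astype(str),
})
# Fake response, purely so the example is self-contained.
df["hardness"] = (2.0 + 0.01 * df["density"] + 0.08 * df["content"]
                  + rng.normal(0, 0.5, size=n))

# Linear mixed model: fixed effect of initial density plus a
# polymer-specific slope of polymer content (the interaction measures
# each polymer's "efficiency"); random intercept per board.
model = smf.mixedlm("hardness ~ density + content:polymer",
                    data=df, groups=df["board"])
fit = model.fit()
print(fit.summary())   # one content slope per polymer group
```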

    SpaCEM3: a software for biological module detection when data is incomplete, high dimensional and dependent

    Summary: Among classical methods for module detection, SpaCEM3 provides ad hoc algorithms shown to be particularly well adapted to specific features of biological data: high dimensionality, interactions between components (genes) and integrated treatment of missing observations. The software, currently in version 2.0, is developed in C++ and can be used either from the command line or through the GUI under Linux and Windows. Availability: The SpaCEM3 software, documentation and datasets are available from http://spacem3.gforge.inria.fr/. Contact: [email protected]; [email protected]

    Triplet Markov fields for the classification of complex structure data

    We address the issue of classifying complex data. We focus on three main sources of complexity, namely the high dimensionality of the observed data, the dependencies between these observations and the general nature of the noise model underlying their distribution. We investigate the recent Triplet Markov Fields and propose new models in this class that can model such data and handle the additional inclusion of a learning step in a consistent way. One advantage of our models is that their estimation can be carried out using state-of-the-art Bayesian clustering techniques. As generative models, they can be seen as an alternative, in the supervised case, to discriminative Conditional Random Fields. Identifiability issues and possible phase-transition phenomena underlying the models in the unsupervised case are discussed, and the models' performance is illustrated on real data exhibiting the various sources of complexity mentioned above.
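    A toy sketch loosely in the spirit of a triplet Markov field follows, under strong simplifying assumptions that are ours, not the paper's: a Potts-type interaction on the pair of hidden class and auxiliary fields, a Gaussian noise model whose level is switched by the auxiliary field, and plain Gibbs sampling in place of the paper's Bayesian clustering machinery.

```python
import numpy as np
from scipy.stats import norm

# Minimal triplet Markov field sketch on a grid: hidden classes X,
# an auxiliary field U that switches the noise model, observations Y.
# Prior: Potts-type interaction on the pair (X, U); likelihood:
#   y_i | x_i = k, u_i = l  ~  Normal(mu[k], sigma[l]).

K, L = 2, 2                     # number of classes / auxiliary states
mu = np.array([0.0, 3.0])       # class means
sigma = np.array([0.5, 2.0])    # auxiliary-dependent noise levels
beta = 1.0                      # spatial interaction strength

def gibbs_sweep(x, u, y, rng):
    H, W = x.shape
    for i in range(H):
        for j in range(W):
            # log p(x_ij = k, u_ij = l | rest), up to a constant
            logp = np.zeros((K, L))
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < H and 0 <= nj < W:
                    logp[x[ni, nj], u[ni, nj]] += beta
            logp += norm.logpdf(y[i, j], mu[:, None], sigma[None, :])
            p = np.exp(logp - logp.max()).ravel()
            idx = rng.choice(K * L, p=p / p.sum())
            x[i, j], u[i, j] = divmod(idx, L)
    return x, u

# Synthetic scene: two classes split along the diagonal, noise level
# drawn at random per site through the auxiliary field.
rng = np.random.default_rng(3)
H = W = 32
x_true = (np.add.outer(np.arange(H), np.arange(W)) > H).astype(int)
y = rng.normal(mu[x_true], sigma[rng.integers(0, L, size=(H, W))])

x = rng.integers(0, K, size=(H, W))
u = rng.integers(0, L, size=(H, W))
for _ in range(20):
    x, u = gibbs_sweep(x, u, y, rng)
print("recovered class agreement:", (x == x_true).mean())
```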

    Development and validation of high-density SNP array in ducks

    Development and validation of a high-density SNP array in ducks. XIth European Symposium on Poultry Genetics (ESPG).

    Assessing the stock of coined metal in the 4th century AD

    Among the themes studied in numismatics with the help of archaeometry, research on the stock of coined metal in the 4th century proves fascinating but delicate. Influxes of new metal resources and deliberate remelting of coins: these phenomena had major consequences for the coinages produced in the various mints of the late Roman Empire. Some of these effects are measurable. Metal alloys (here, coin alloys) contain minor or trace elements that constitute a kind of "identity card" for the supply of the mints. Ten years ago, Isabelle Bollard and Jean-Noël Barrandon brought together several analyses of copper-alloy coins struck during the 4th century. Our own observations, obtained within the FANUM research project funded by the Normandy region, on 14,528 coins (nummi) of the Constantinian period contained in a hoard discovered fortuitously at Saint-Germain-de-Varreville (Manche), prompted a more thorough study of the alloys used during this period, focusing in particular on the gold and silver contents. Indeed, the wide variety of mints represented in the hoard was ideally suited to such an inquiry. The choice of portable X-ray fluorescence made it possible to develop a precise analysis protocol adapted to a large number of coins. The 774 analyses carried out on nummi struck in the Western mints complement earlier work and also highlight the use of different metal stocks from the 320s onwards.
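    As a sketch of how distinct metal stocks might show up in such trace-element data, the snippet below clusters hypothetical gold and silver contents with a two-component Gaussian mixture in log space. The data, the column meanings, and the choice of clustering method are illustrative assumptions, not the project's actual protocol; only the count of 774 analyses is taken from the abstract.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical XRF results: trace gold and silver contents (wt%) for a
# batch of nummi. Two synthetic "metal stocks" with different trace
# signatures stand in for the real measurements.
rng = np.random.default_rng(4)
stock_a = rng.lognormal(mean=[np.log(0.02), np.log(0.3)], sigma=0.25,
                        size=(400, 2))
stock_b = rng.lognormal(mean=[np.log(0.06), np.log(1.2)], sigma=0.25,
                        size=(374, 2))
au_ag = np.vstack([stock_a, stock_b])     # 774 coins, [Au, Ag] in wt%

# Trace contents are ratio-scale and skewed, so cluster in log space.
gmm = GaussianMixture(n_components=2, random_state=0)
labels = gmm.fit_predict(np.log(au_ag))
for k in range(2):
    n = (labels == k).sum()
    med = np.median(au_ag[labels == k], axis=0)
    print(f"group {k}: {n} coins, median Au {med[0]:.3f}%, Ag {med[1]:.2f}%")
```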